Conversation

@kallewoof (Contributor)
Adopts code from ikawrakow/ik_llama.cpp#33 to fix the broken llama-perplexity.

Fixes #9316.
@kallewoof kallewoof force-pushed the 202508-binary-prompts branch from 0f99ce7 to b9c1aef on August 14, 2025 03:51
@ddh0 (Contributor) commented Aug 14, 2025

I'm not an authority on this subject, but as I understand it, the llama.cpp maintainers are hesitant to accept code taken from ikawrakow's fork due to some past conflicts. Just FYI.

@kallewoof (Contributor, Author) commented Aug 14, 2025

> I'm not an authority on this subject, but as I understand it, llama.cpp is hesitant to accept code that's taken from ikawrakow due to some past conflicts. Just FYI

I kind of figured. But as it stands, the code is broken, and it's a pain to have to juggle llama builds on a per-task basis. The change is really very trivial. I could rewrite it by hand if that is preferred, but I doubt it would come out much different from the original.

@kallewoof (Contributor, Author)

After adjusting arguments I'm no longer sure this patch is necessary so closing for now.

@kallewoof kallewoof closed this Aug 14, 2025


Development

Successfully merging this pull request may close these issues:

Bug: llama-perplexity error using multiple-choice binary data